Patent abstract:
The present invention relates to a mobile terminal (100) for efficiently sharing many images and a method of controlling the same. In particular, the present invention relates to a mobile terminal (100) including a touch screen, a memory (170) configured to store a plurality of images, a wireless communication unit (110) configured to transmit and receive data with a peer terminal, and a controller (180) configured to control the touch screen to output an online dialog window including a history of messages transmitted to and received from the peer terminal, the controller (180), if a portion of the plurality of stored images is selected via the online dialog window, being configured to control the wireless communication unit (110) to transmit thumbnail images for the selected portion of the plurality of stored images to the peer terminal.
Publication number: FR3031601A1
Application number: FR1554527
Filing date: 2015-05-20
Publication date: 2016-07-15
Inventors: Insub Shin; Wooseok Han; Sungho Kim
Applicant: LG Electronics Inc.
IPC main class:
Patent description:

[0001] The present invention relates to a mobile terminal enabling a user to more conveniently use the terminal and a method of controlling the same. A mobile terminal is a device that can be configured to perform various functions. Examples of such functions include data and voice communications, image and video capture via a camera, audio recording, music file playback and music output via a speaker system, and displaying images and video on a display. Some terminals include additional functionality that supports games, while other terminals are also configured as media players. More recently, mobile terminals have been configured to receive broadcast and multicast signals that allow viewing of content, such as videos and television programs. Generally, terminals can be classified into mobile terminals and fixed terminals depending on the presence or absence of mobility. And, the mobile terminals can be further classified into hand-held terminals and vehicle-mounted terminals depending on whether they can be carried by hand. Efforts are underway to support and increase the functionality of mobile terminals. These efforts include improvements in software and hardware, as well as changes and improvements in the structural components that make up the mobile terminal. Data can be transmitted and received between identified mobile terminals. Users can deliver content such as a text message, an image, a video and the like to a preferred peer using the data. Generally, a text message is delivered via an online dialog window in which a specific terminal is designated as a peer. In order to transmit many images using the online dialog window, each of the images has to be designated one by one. When many images are received using the online dialog window, since unwanted images can be received regardless of a user's intent, data resources can be wasted.
It is therefore necessary to have a control method capable of efficiently transmitting multimedia contents such as an image and a video. Accordingly, the present invention is directed to an apparatus and a method thereof that substantially obviate one or more problems due to limitations and disadvantages of the related art. An object of the present invention is to solve the above problem and other problems. Another object of the present invention is to provide a mobile terminal for efficiently transmitting and receiving a large number of images and a control method thereof. A further object of the present invention is to provide a mobile terminal capable of efficiently searching for an image to be transmitted and designating the image, and a control method thereof. Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those skilled in the art upon examination of the following or may be learned from practice of the invention. The objects and other advantages of the invention may be realized and attained by the structure particularly pointed out in the written description and its claims as well as in the accompanying drawings.
To achieve these objects and other advantages and in accordance with the purpose of the invention, as embodied and broadly described herein, according to one embodiment, a mobile terminal includes a touch screen, a memory configured to store a plurality of images, a wireless communication unit configured to transmit and receive data with a peer terminal, and a controller configured to control the touch screen to output an online dialog window including a history of messages transmitted to and received from the peer terminal, the controller, if a portion of the plurality of stored images is selected via the online dialog window, being configured to control the wireless communication unit to transmit thumbnail images for the selected portion of the plurality of stored images to the peer terminal.
[0002] To further achieve these objects and other advantages in accordance with the purpose of the invention, as embodied and broadly described herein, according to a different embodiment, a method of controlling a mobile terminal includes the steps of storing a plurality of images in a memory, transmitting and receiving data with a peer terminal, outputting an online dialog window including a history of messages transmitted to and received from the peer terminal, and, if a portion of the plurality of stored images is selected via the online dialog window, transmitting thumbnail images for the selected portion of the plurality of stored images to the peer terminal. It should be understood that the foregoing general description and the following detailed description of the preferred embodiments of the present invention are both given by way of example and explanation and are intended to provide a further explanation of the invention as claimed.
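The claimed control flow — store a plurality of images, let the user select a portion of them via the online dialog window, then transmit only thumbnails of that selection to the peer terminal — can be sketched as follows. This is an illustrative model only: the `Image`, `make_thumbnail` and `MobileTerminal` names, and the truncation stand-in for real downscaling, are assumptions and not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class Image:
    name: str
    data: bytes

def make_thumbnail(image: Image, max_bytes: int = 16) -> Image:
    # Stand-in for real downscaling: truncate the payload to a small preview.
    return Image(name=f"thumb_{image.name}", data=image.data[:max_bytes])

class MobileTerminal:
    def __init__(self):
        self.memory: list[Image] = []   # memory (170): the stored images
        self.sent: list[Image] = []     # what the wireless unit (110) transmitted

    def store(self, image: Image):
        self.memory.append(image)

    def share_selection(self, selected_names: set[str]) -> list[Image]:
        """If a portion of the stored images is selected via the dialog
        window, transmit thumbnails (not the originals) to the peer."""
        thumbs = [make_thumbnail(img) for img in self.memory
                  if img.name in selected_names]
        self.sent.extend(thumbs)
        return thumbs

terminal = MobileTerminal()
for n in ("a.jpg", "b.jpg", "c.jpg"):
    terminal.store(Image(n, b"\x00" * 1024))

sent = terminal.share_selection({"a.jpg", "c.jpg"})
print([t.name for t in sent])               # thumbnails for the selected portion only
print(all(len(t.data) <= 16 for t in sent)) # previews are much smaller than originals
```

The design point the claim emphasizes is that only lightweight previews cross the network at first; the original images are fetched later, on request, which is what saves the data resource.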
[0003] The present invention will be more fully understood from the detailed description given below and the accompanying drawings, which are given by way of illustration only, and therefore do not limit the present invention, and in which: FIG. 1A is a block diagram of a mobile terminal in accordance with the present disclosure; FIGS. 1B and 1C are conceptual views of an example of the mobile terminal, seen from different directions; Fig. 2 is a flowchart of a method for efficiently transmitting a plurality of images using an online text message dialog window according to one embodiment of the present invention; Fig. 3 is a diagram of a first example of easily designating a plurality of images according to an embodiment of the present invention; Fig. 4 is a diagram of a method of controlling designation of an image to be transmitted based on a category-classified search result in accordance with an embodiment of the present invention; Fig. 5 is an explanatory diagram of a scrolling direction of an online dialog window and a thumbnail list according to an embodiment of the present invention; Fig. 6 is a diagram of a method for controlling reception of an additional thumbnail list, in case of receiving a message requesting transmission of a different image, according to an embodiment of the present invention; Fig. 7 is a diagram of a method for controlling sorting of a thumbnail list in a mobile terminal of a requesting side, which wants to share an image, according to an embodiment of the present invention; Fig. 8 is a diagram of a method for controlling selection of a thumbnail from a thumbnail list and requesting an original image for the selected thumbnail according to an embodiment of the present invention; Fig. 9 is a diagram of an image storage control method according to an embodiment of the present invention when a right of access to a computing cloud is set up; Fig. 10 is a diagram of a thumbnail output control method for enlarging a thumbnail in a thumbnail list according to an embodiment of the present invention; Fig. 11 is a diagram of a method for controlling output of summary information on a thumbnail list according to an embodiment of the present invention; Fig. 12 is a diagram of a method for controlling a request to share an image, which is shared between other mobile terminals, according to an embodiment of the present invention; FIG. 13 is a diagram of a method for automatically changing an access right of a cloud server, when a mobile terminal A (100A) of a sharer side exits a group online dialog, according to an embodiment of the present invention; Figs. 14, 15, 16 and 17 are diagrams of a method for controlling designation of images based on a face of a specific counterpart according to an embodiment of the present invention; Fig. 18 is a diagram of a method of capturing an image via an activated camera and designating a plurality of images based on the captured image according to an embodiment of the present invention; Fig. 19 is a diagram of a method for controlling individual designation of an image via a thumbnail list according to an embodiment of the present invention; Fig. 20 is a diagram of a date designation control method according to an embodiment of the present invention; Fig. 21 is a diagram of a method for controlling designation of an image based on a location at which the image is captured; Figures 22, 23, 24 and 25 are diagrams of a method for controlling designation of an image based on a location at which the image is captured and a date on which the image is captured according to an embodiment of the present invention; Fig. 26 is a diagram of a method for controlling reception of a search keyword from a requesting side and searching for an image based on the search keyword according to an embodiment of the present invention; Fig. 27 is a diagram of a method for designating a search keyword in a received message according to an embodiment of the present invention; Fig. 28 is a diagram of a method for controlling a change of an image search condition category according to an embodiment of the present invention; Fig. 29 is a diagram of a method for controlling switching between image search conditions in a switched location category in response to a touch-drag input in a second direction according to an embodiment of the present invention; Fig. 30 is a diagram of a method for controlling provision of a plurality of images corresponding to a designated image search condition when the image search condition is designated in accordance with an embodiment of the present invention. A detailed description will now be given according to exemplary embodiments disclosed herein with reference to the accompanying drawings. For brevity of description with reference to the drawings, the same or equivalent components may be provided with the same reference numerals, and their description will not be repeated. In general, a suffix such as "module" and "unit" can be used to designate elements or components. The use of such a suffix is here merely intended to facilitate the description of the specification, and the suffix itself is not meant to give any special meaning or function. In the present disclosure, what is well known to those skilled in the art concerned has generally been omitted for the sake of brevity. The accompanying drawings are used to assist in easily understanding various technical features and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed as extending to any modifications, equivalents and substitutions in addition to those specifically set forth in the accompanying drawings. It will be understood that although the terms first, second, etc.
can be used here to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another.
[0004] It will be understood that when one element is designated as "connected to" another element, the element may be connected to the other element or intervening elements may also be present. On the other hand, when an element is designated as being "directly connected to" another element, no intervening element is present.
[0005] A singular representation may include a plural representation unless it represents a definitely different meaning from the context. Terms such as "include" or "have" are used here and should be understood to indicate the existence of several components, functions or steps disclosed in the specification, and it is also understood that greater or fewer components, functions, or steps may likewise be used. The mobile terminals presented here can be implemented using a variety of different types of terminals. Examples of such terminals include cellular phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, portable computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)), and the like.
[0006] By way of nonlimiting example only, a further description will be given with reference to particular types of mobile terminals. Nevertheless, these teachings also apply to other types of terminals, such as the types noted above. In addition, these teachings can also be applied to fixed terminals such as digital TVs, desktop computers, and the like. Reference is now made to Figs. 1A to 1C, where Fig. 1A is a block diagram of a mobile terminal in accordance with the present invention, and Figs. 1B and 1C are conceptual views of an example of the mobile terminal, seen from different directions. The mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180 and a power source unit 190. It is to be understood that implementing all of the illustrated components is not a requirement, and that greater or fewer components may alternatively be implemented. Referring now to FIG. 1A, the mobile terminal 100 is shown having a wireless communication unit 110 configured with a plurality of jointly implemented components. For example, the wireless communication unit 110 typically includes one or more components that allow wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal is located. The wireless communication unit 110 typically includes one or more modules that enable communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, and communications between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 typically includes one or more modules that connect the mobile terminal 100 to one or more networks.
To facilitate such communications, the wireless communication unit 110 includes one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
[0007] The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is a type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push button, a mechanical key, a function key and the like) allowing a user to enter information. Data (e.g. audio, video, image and the like) is obtained by the input unit 120 and can be analyzed and processed by the controller 180 according to device parameters, user commands and combinations thereof.
[0008] The detection unit 140 is typically implemented using one or more sensors configured to detect internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information and the like. For example, in FIG. 1A, the detection unit 140 is shown having a proximity sensor 141 and an illumination sensor 142.
[0009] If desired, the detection unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (e.g. camera 121), a microphone 122, a battery gauge, an environmental sensor (for example a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor and a gas sensor, among others), and a chemical sensor (for example an electronic nose, a health care sensor, a biometric sensor and the like), to name a few. The mobile terminal 100 may be configured to use information obtained from the detection unit 140, and in particular, information obtained from one or more sensors of the detection unit 140, and combinations thereof. The output unit 150 is typically configured to output various types of information, such as audio, video, touch, and the like. The output unit 150 is shown having a display unit 151, an audio output module 152, a haptic module 153 and an optical output module 154. The display unit 151 may have an interlayer structure or a structure integrated with a touch sensor to facilitate a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as operate as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user.
[0010] The interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, can include any of wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. In some cases, the mobile terminal 100 may perform various control functions associated with a connected external device, in response to the connection of the external device to the interface unit 160.
[0011] The memory 170 is typically implemented to store data to support various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100 and the like. Some of these application programs can be downloaded from an external server via wireless communication. Other application programs may be installed within the mobile terminal 100 at the time of manufacture or shipment, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, making a call, receiving a message, sending a message and the like). It is common for the application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100. The controller 180 typically operates to control the overall operation of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 may provide or process information or functions appropriate for a user by processing signals, data, information and the like, which are input or output by the various components shown in FIG. 1A, or by activating application programs stored in the memory 170. By way of example, the controller 180 controls all or part of the components illustrated in FIGS. 1A to 1C according to the execution of an application program that has been stored in the memory 170.
[0012] The power source unit 190 may be configured to receive an external power supply or provide an internal power supply to provide an appropriate power required to operate the elements and components included in the mobile terminal 100. The power source unit 190 may include a battery, and the battery may be configured to be embedded in the terminal body, or configured to be detachable from the terminal body. Referring again to FIG. 1A, various components shown in this figure will now be described in greater detail. With respect to the wireless communication unit 110, the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or information associated with the broadcast from an external broadcast management entity over a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be used to facilitate simultaneous reception of two or more broadcast channels, or to support switching among broadcast channels. The mobile communication module 112 can transmit and/or receive wireless signals to and from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server and the like. Such network entities are part of a wireless communication network, which is constructed according to technical standards or communication methods for mobile communications (e.g., Global System for Mobile communications (GSM), Code Division Multiple Access (CDMA), CDMA 2000, Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), and the like).
Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, or various data formats for supporting communication of text and multimedia messages.
[0013] The wireless Internet module 113 is configured to facilitate wireless Internet access. This module can be coupled to the mobile terminal 100 internally or externally. The wireless Internet module 113 can transmit and/or receive wireless signals via communication networks according to wireless Internet technologies.
[0014] Examples of such wireless Internet access include Wireless Local Area Network (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), and the like. The wireless Internet module 113 can transmit/receive data according to one or more of these wireless Internet technologies, as well as other Internet technologies. In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A and the like, as part of a mobile communication network, the wireless Internet module 113 achieves such wireless Internet access. As such, the wireless Internet module 113 can cooperate with, or function as, the mobile communication module 112.
[0015] The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTH™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Wireless USB (Universal Serial Bus), and the like. The short-range communication module 114 generally supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless local area networks. An example of such wireless local area networks is a wireless personal area network.
[0016] In some embodiments, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a wearable device, for example a smart watch, smart glasses or a head-mounted display (HMD), which can exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100). The short-range communication module 114 can detect or recognize the wearable device, and allow communication between the wearable device and the mobile terminal 100. In addition, when the detected wearable device is a device that is authenticated to communicate with the mobile terminal 100, the controller 180, for example, can cause data processed in the mobile terminal 100 to be transmitted to the wearable device via the short-range communication module 114. From there, a user of the wearable device can use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the wearable device. Similarly, when a message is received in the mobile terminal 100, the user can check the received message using the wearable device. The location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position of the mobile terminal. For example, the location information module 115 includes a global positioning system (GPS) module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally operate with any of the other modules of the wireless communication unit 110 to obtain data relating to the position of the mobile terminal.
[0017] By way of example, when the mobile terminal uses a GPS module, a position of the mobile terminal can be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal can be acquired based on information relating to a wireless access point (AP) that transmits or receives a wireless signal to or from the Wi-Fi module. The input unit 120 may be configured to allow various types of input into the mobile terminal 100. Examples of such inputs include audio, image, video, data and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 can process still image or video frames obtained by image sensors in a video or image capture mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170.
[0018] In some cases, the cameras 121 may be arranged in a matrix configuration to allow a plurality of images having various angles or focal points to be input to the mobile terminal 100. As another example, the cameras 121 can be located in a stereoscopic arrangement for acquiring left and right images for implementing a stereoscopic image.
[0019] The microphone 122 is generally implemented to allow audio input into the mobile terminal 100. The audio input can be processed in various ways according to a function performed in the mobile terminal 100. If desired, the microphone 122 may include various noise removing algorithms to suppress unwanted noise generated in the course of receiving external audio.
[0020] The user input unit 123 is a component that allows input by a user. Such user input may allow the controller 180 to control the operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch and the like), or a touch-sensitive input, among others. For example, the touch-sensitive input may be a virtual key or a function key, which is displayed on a touch screen by software processing, or a touch key which is located on the mobile terminal at a location which is other than on the touch screen. In addition, the virtual key or the visual key can be displayed on the touch screen in various forms, for example graphic, text, icon, video or one of their combinations. The detection unit 140 is generally configured to detect one or more of internal information of the mobile terminal, information of the surrounding environment of the mobile terminal, user information or the like. The controller 180 generally cooperates with the detection unit 140 to control the operation of the mobile terminal 100, or to execute data processing, a function or an operation associated with an application program installed in the terminal. The detection unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail.
[0021] The proximity sensor 141 may include a sensor for detecting the presence or absence of an object approaching a surface, or an object located near a surface, using an electromagnetic field, infrared rays, or the like without mechanical contact. The proximity sensor 141 may be arranged at an inner region of the mobile terminal covered by the touch screen, or near the touch screen. The proximity sensor 141 may for example include any one of a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared proximity sensor, and the like. When the touch screen is implemented as a capacitance type, the proximity sensor 141 can detect the proximity of a pointer to the touch screen by changes in an electromagnetic field, which responds to the approach of an object having conductivity. In this case, the touch screen (touch sensor) can also be categorized as a proximity sensor. The term "proximity touch" will often be used here to refer to the scenario in which a pointer is positioned to be close to the touch screen without coming into contact with the touch screen. The term "contact touch" will often be used here to refer to the scenario in which a pointer makes physical contact with the touch screen. For the position corresponding to the proximity touch of the pointer relative to the touch screen, such a position will correspond to a position where the pointer is perpendicular to the touch screen. The proximity sensor 141 can detect a proximity touch, and proximity touch patterns (e.g., distance, direction, speed, time, position, travel status, and the like). In general, the controller 180 processes data corresponding to proximity touches and proximity touch patterns detected by the proximity sensor 141, and outputs visual information to the touch screen.
In addition, the controller 180 may control the mobile terminal 100 to perform different operations or process different data depending on whether a touch with respect to a point on the touch screen is either a proximity touch or a contact touch. A touch sensor can detect a touch applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others.
[0022] As an example, the touch sensor may be configured to convert changes of pressure applied to a specific portion of the display unit 151, or a capacitance occurring at a specific portion of the display unit 151, into electrical input signals. The touch sensor may also be configured to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen, a stylus, a pointer, or the like. When a touch input is detected by a touch sensor, corresponding signals can be transmitted to a touch controller. The touch controller can process the received signals and then transmit corresponding data to the controller 180. Accordingly, the controller 180 can detect which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or combinations thereof. In some embodiments, the controller 180 may execute the same or different commands depending on the type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same command or a different command depending on the object that provides the touch input may be decided based on a current operating state of the mobile terminal 100 or an application program currently running, for example.
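The signal path just described — a touch sensor reading a capacitance change, a touch controller converting it into touch data, and the controller (180) then determining which region of the display unit (151) was touched — can be sketched as follows. The region grid, the capacitance threshold value, and the class names are illustrative assumptions, not details from the disclosure.

```python
REGIONS = [["top-left", "top-right"],
           ["bottom-left", "bottom-right"]]

class TouchController:
    """Converts raw sensor readings into touch data for the controller (180)."""
    def __init__(self, width: int, height: int, threshold: float = 0.5):
        self.width, self.height, self.threshold = width, height, threshold

    def process(self, x: int, y: int, capacitance: float):
        # Below the threshold there is no contact touch, so no event is reported.
        if capacitance < self.threshold:
            return None
        return {"x": x, "y": y, "pressure": capacitance}

class Controller:
    """Determines which region of the display unit (151) was touched."""
    def __init__(self, touch_controller: TouchController):
        self.tc = touch_controller

    def region_of(self, event) -> str:
        row = 1 if event["y"] >= self.tc.height // 2 else 0
        col = 1 if event["x"] >= self.tc.width // 2 else 0
        return REGIONS[row][col]

tc = TouchController(width=1080, height=1920)
ctrl = Controller(tc)

event = tc.process(x=900, y=300, capacitance=0.8)   # firm touch, upper right
print(ctrl.region_of(event))
print(tc.process(x=10, y=10, capacitance=0.1))      # weak reading: filtered out
```

Keeping the touch controller separate from the main controller, as the paragraph allows, mirrors the split in this sketch: the former only produces events, the latter decides what they mean.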
[0023] The touch sensor and proximity sensor may be implemented individually, or in combination, to detect various types of touch. Such touches include a short touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like. If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180 may, for example, calculate a position of a wave generation source based on information detected by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time required for the light to reach the optical sensor is much shorter than the time required for the ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source can be calculated using this fact. For example, the position of the wave generation source can be calculated using the time difference relative to the time required for the ultrasonic wave to reach the sensor, with light as a reference signal. The camera 121 typically includes at least one camera sensor (CCD, CMOS, etc.), a photosensitive sensor (or image sensors) and a laser sensor. Implementing the camera 121 with a laser sensor can allow the detection of a touch of a physical object with respect to a 3D stereoscopic image. The photosensitive sensor may be laminated on, or overlapped by, the display device. The photosensitive sensor may be configured to scan the movement of the physical object near the touch screen. In more detail, the photosensitive sensor may include photodiodes and transistors at rows and columns for scanning the content received at the photosensitive sensor using an electrical signal that changes depending on the amount of light applied.
Namely, the photosensitive sensor can calculate the coordinates of the physical object according to variations of light and thereby obtain position information of the physical object. The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executing at the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information.
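The light/ultrasound time-difference calculation described above can be sketched as follows. The light arrival is treated as the (effectively instantaneous) reference signal; the speed of sound is an assumed constant, and the function name is illustrative rather than from the disclosure.

```python
# Illustrative calculation of the distance to a wave generation source from
# the arrival times of light and ultrasound, using light as the reference.
SPEED_OF_SOUND = 343.0  # m/s in air (assumed)

def distance_to_source(t_light, t_ultrasound):
    """Distance follows from the extra delay of the ultrasonic wave
    relative to the (near-instant) light reference."""
    return SPEED_OF_SOUND * (t_ultrasound - t_light)

# An ultrasonic delay of 5 ms relative to light puts the source ~1.7 m away.
print(distance_to_source(0.0, 0.005))
```

With a plurality of ultrasonic sensors, combining the per-sensor distances (trilateration) would yield the position of the wave generation source, which a single sensor cannot provide.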
[0024] In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a scheme with glasses), an autostereoscopic scheme (a scheme without glasses), a projection scheme (holographic scheme) or the like. The audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any one of a number of different sources, so that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal receiving mode, a calling mode, a recording mode, a speech recognition mode, a broadcast receiving mode and the like. The audio output module 152 may provide an audible output relating to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer or the like. A haptic module 153 may be configured to generate various tactile effects that a user feels, perceives or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is a vibration. The intensity, pattern, and the like of the vibration generated by the haptic module 153 may be controlled by user selection or by adjustment by the controller. For example, the haptic module 153 can output different vibrations in a combined or sequential manner.
[0025] In addition to the vibration, the haptic module 153 can generate various other tactile effects, including a stimulation effect such as a pin arrangement moving vertically to contact the skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, contact of an electrode, an electrostatic force, an effect reproducing the sensation of cold and warmth using an element that can absorb or generate heat, and the like. The haptic module 153 may also be implemented to allow the user to experience a tactile effect through a muscular sensation such as the user's fingers or arm, as well as by transferring the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100. An optical output module 154 may output a signal to indicate event generation using light from a light source. Examples of events generated in the mobile terminal 100 may include message reception, call waiting, a missed call, an alarm, a calendar announcement, an e-mail reception, information reception through an application, and the like.
[0026] A signal output from the optical output module 154 may be implemented so that the mobile terminal emits monochromatic light or light of a plurality of colors. The signal output may be terminated when the mobile terminal detects that a user has checked the generated event, for example. The interface unit 160 serves as an interface for external devices to be connected to the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive power to transfer to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headset ports, external power source ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like. The identification module may be a chip that stores various information for authenticating the authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and the like. In addition, the device comprising the identification module (also referred to herein as an "identifying device") may take the form of a smart card. Accordingly, the identifying device can be connected to the terminal 100 via the interface unit 160. When the mobile terminal 100 is connected to an external cradle, the interface unit 160 can serve as a passage to allow power from the cradle to be supplied to the mobile terminal 100, or may serve as a passage for transferring various command signals entered by the user from the cradle to the mobile terminal. Various command signals or the power input from the cradle may function as signals for recognizing that the mobile terminal is properly mounted on the cradle.
The memory 170 may store programs to support operations of the controller 180 and store input/output data (e.g., phone book, messages, still images, videos, etc.). The memory 170 can store data relating to various patterns of vibration and audio that are output in response to touch inputs on the touch screen. The memory 170 may include one or more types of storage media including a flash memory, a hard disk, a solid state disk, a silicon disk, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read only memory (ROM), an electrically erasable programmable read only memory (EEPROM), a programmable read only memory (PROM), a magnetic memory, a magnetic disk, an optical disk and the like. The mobile terminal 100 may also be operated in connection with a network storage device that performs the storage function of the memory 170 over a network, such as the Internet. The controller 180 may typically control the general operations of the mobile terminal 100. For example, the controller 180 may set or release a lock state for restricting a user's input of a control command with respect to applications when a status of the mobile terminal satisfies a preset condition. The controller 180 may also perform the control and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters (or words) or images, respectively. In addition, the controller 180 may control one component or a combination of these components in order to implement various embodiments disclosed herein.
[0027] The power source unit 190 receives external power or provides internal power and supplies the appropriate power required for operating the respective elements and components included in the mobile terminal 100. The power source unit 190 may include a battery, which is typically rechargeable or detachably coupled to the terminal body for charging. The power source unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160 to which an external charger for supplying power to recharge the battery is electrically connected.
[0028] As another example, the power source unit 190 may be configured to recharge the battery wirelessly without use of the connection port. In this example, the power source unit 190 can receive power, transferred from an external wireless power transmitter, using either an inductive coupling method based on magnetic induction or a magnetic resonance coupling method based on electromagnetic resonance. Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium or a similar medium using, for example, software, hardware, or a combination thereof.
[0029] Referring now to FIGS. 1B and 1C, the mobile terminal 100 is described with reference to a bar-type terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a variety of different configurations. Examples of such configurations include watch type, clip type, glasses type, or folder type, flip type, slide type, swing type and swivel type, in which two or more bodies are combined with each other in a relatively movable manner, and combinations thereof. The present discussion will often relate to a particular type of mobile terminal (for example, a bar type, a watch type, a glasses type and the like).
[0030] Nevertheless, such teachings with regard to a particular type of mobile terminal will generally apply to other types of mobile terminals as well. The mobile terminal 100 will generally include a housing (for example a frame, a case, a cover and the like) forming the appearance of the terminal. In this embodiment, the housing is formed using a front housing 101 and a rear housing 102. Various electronic components are incorporated in a space formed between the front housing 101 and the rear housing 102. At least one middle housing may additionally be positioned between the front housing 101 and the rear housing 102. The display unit 151 is shown located on the front side of the terminal body for outputting information. As illustrated, a window 151a of the display unit 151 may be mounted on the front housing 101 to form the front surface of the terminal body together with the front housing 101. In some embodiments, electronic components may also be mounted on the rear housing 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card and the like. The back shell 103 is shown covering the electronic components, and this shell can be detachably coupled to the rear housing 102. As a result, when the back shell 103 is detached from the rear housing 102, the electronic components mounted on the rear housing 102 are exposed to the outside. As illustrated, when the back shell 103 is coupled to the rear housing 102, a side surface of the rear housing 102 is partially exposed. In some cases, during coupling, the rear housing 102 may be completely obscured by the back shell 103. In some embodiments, the back shell 103 may include an opening for exposing a camera 121b or an audio output module 152b to the outside. The housings 101, 102, 103 may be formed by injection molding of a synthetic resin or may be formed of a metal, for example stainless steel (STS), aluminum (Al), titanium (Ti) or the like.
[0031] As an alternative to the example in which the plurality of housings form an internal space for housing components, the mobile terminal 100 may be configured such that a single housing forms the internal space. In this example, a mobile terminal 100 having a uni-body is formed so that a synthetic resin or metal extends from a side surface to a rear surface. If desired, the mobile terminal 100 may include a water sealing unit to prevent the introduction of water into the terminal body.
[0032] For example, the water sealing unit may include a water seal which is located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the back shell 103, to hermetically seal an internal space when these housings are coupled.
[0033] Figures 1B and 1C show certain components as arranged on the mobile terminal. However, it should be understood that alternative arrangements are possible and within the teachings of the present disclosure. Some components may be omitted or rearranged. For example, the first handling unit 123a may be located on another surface of the terminal body, and the second audio output module 152b may be located on the side surface of the terminal body. The display unit 151 outputs information processed in the mobile terminal 100. The display unit 151 may be implemented using one or more suitable display devices. Examples of such suitable display devices include a liquid crystal display (LCD), a thin-film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED), a flexible display, a 3-dimensional (3D) display, an electronic ink display, and combinations thereof. The display unit 151 may be implemented using two display devices, which may implement the same or different display technology. For example, a plurality of display units 151 may be arranged on one side, spaced apart from each other, or these devices may be integrated, or these devices may be arranged on different surfaces.
[0034] The display unit 151 may also include a touch sensor that detects a touch input received at the display unit. When a touch is input to the display unit 151, the touch sensor may be configured to detect this touch, and the controller 180 may, for example, generate a control command or other signal corresponding to the touch. Content that is input by touch may be a text or numerical value, or a menu item that can be indicated or designated in various modes. The touch sensor may be configured in the form of a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or a metal wire patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display.
[0035] The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen can serve as a user input unit 123 (see Figure 1A). As a result, the touch screen can replace at least a portion of the functions of the first handling unit 123a. The first audio output module 152a may be implemented as a speaker to output voice audio, alarm sounds, multimedia audio reproduction, and the like. The window 151a of the display unit 151 will typically include an aperture to allow the audio generated by the first audio output module 152a to pass. An alternative is to allow the audio to be released along an assembly gap between the structural bodies (for example a gap between the window 151a and the front housing 101). In this case, a hole formed independently for outputting audio sounds may not be seen or may otherwise be hidden in appearance, further simplifying the appearance and fabrication of the mobile terminal 100. The optical output module 154 may be configured to output light to indicate the generation of an event. Examples of such events include message reception, call waiting reception, a missed call, an alarm, a calendar announcement, e-mail reception, information reception through an application, and the like. When a user has checked a generated event, the controller can control the optical output module 154 to stop the light output. The first camera 121a can process image frames such as still or moving images obtained by the image sensor in a capture mode or a video call mode. The processed image frames may then be displayed on the display unit 151 or stored in the memory 170. The first and second handling units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to provide an input to the mobile terminal 100.
The first and second handling units 123a and 123b may also be commonly referred to as a handling portion, and may employ any tactile method that allows the user to perform a manipulation such as touching, pushing, scrolling, or the like. The first and second handling units 123a and 123b may also employ any non-tactile method that allows the user to perform a manipulation such as a proximity touch, a pointing touch, or the like. Figure 1B illustrates the first handling unit 123a as a touch key, but possible alternatives include a mechanical key, a push key, a touch key, and combinations thereof. An input received at the first and second handling units 123a and 123b may be used in various ways. For example, the first handling unit 123a may be used by the user to provide an input for a menu, home, cancel, search, or the like, and the second handling unit 123b may be used by the user to provide an input for controlling a volume level output by the first or second audio output module 152a or 152b, for switching to a touch recognition mode of the display unit 151, or the like. As another example of the user input unit 123, a rear input unit may be located on the rear surface of the terminal body. The rear input unit may be manipulated by a user to provide an input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scrolling, controlling the volume level output from the first or second audio output module 152a or 152b, switching to a touch recognition mode of the display unit 151, and the like. The rear input unit can be configured to allow a touch input, a push input, or combinations thereof.
[0036] The rear input unit may be located to overlap the display unit 151 of the front side in a thickness direction of the terminal body. For example, the rear input unit may be located on an upper end portion of the rear side of the terminal body so that a user can easily manipulate it with an index finger when the user grips the terminal body with one hand. Alternatively, the rear input unit may be positioned at almost any location on the rear side of the terminal body. Embodiments that include the rear input unit may implement all or part of the functionality of the first handling unit 123a in the rear input unit. As such, in situations where the first handling unit 123a is omitted from the front side, the display unit 151 may have a larger screen. As a further alternative, the mobile terminal 100 may include a finger scan sensor that scans a user's fingerprint. The controller 180 may then use fingerprint information detected by the finger scan sensor as part of an authentication procedure. The finger scan sensor may also be installed in the display unit 151 or implemented in the user input unit 123. The microphone 122 is shown located at one end of the mobile terminal 100, but other locations are possible. If desired, multiple microphones may be implemented, such an arrangement allowing stereo sounds to be received. The interface unit 160 may serve as a path allowing the mobile terminal 100 to interface with external devices. For example, the interface unit 160 may include one or more of a connection terminal for connection to another device (e.g., an earphone, an external speaker, or the like), a port for near field communication (for example, an Infrared Data Association (IrDA) port, a Bluetooth port, a wireless LAN port and the like), or a power source terminal for supplying power to the mobile terminal 100.
The interface unit 160 may be implemented in the form of a socket for receiving an external card, such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for storing information.
[0037] The second camera 121b is shown located at the rear side of the terminal body and includes an image capture direction substantially opposite to the image capture direction of the first camera unit 121a. If desired, the second camera 121b may alternatively be located at other locations, or made movable, so as to have an image capture direction different from that shown. The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such cameras may be referred to as an "array camera". When the second camera 121b is implemented as an array camera, images can be captured in various ways using the plurality of lenses, and images of better quality can be obtained. As shown in FIG. 1C, a flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. As shown in FIG. 1B, the second audio output module 152b may be located on the terminal body. The second audio output module 152b may implement stereophonic sound functions together with the first audio output module 152a, and may also be used to implement a speakerphone mode for call communication. At least one antenna for wireless communication may be located on the terminal body. The antenna may be installed in the terminal body or formed by the housing. For example, an antenna that configures a portion of the broadcast receiving module 111 may be retractable into the terminal body. Alternatively, an antenna may be formed using a film attached to an inner surface of the back shell 103, or a housing that includes a conductive material. A power source unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to the outside of the terminal body.
The battery 191 can receive power via a power source cable connected to the interface unit 160. Also, the battery 191 can be recharged wirelessly using a wireless charger. Wireless charging may be implemented by magnetic induction or electromagnetic resonance.
[0038] The back shell 103 is shown coupled to the rear housing 102 to obscure the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from an external impact or a foreign object. When the battery 191 is detachable from the terminal body, the back shell 103 can be detachably coupled to the rear housing 102. An accessory for protecting an appearance or assisting or extending the functions of the mobile terminal 100 may also be provided on the mobile terminal 100. As one example of an accessory, a shell or holster for covering or accommodating at least one surface of the mobile terminal 100 may be provided. The shell or holster can cooperate with the display unit 151 to extend the functions of the mobile terminal 100. Another example of the accessory is a touch pen for assisting or extending a touch input to a touch screen. Additional preferred embodiments will be described in more detail with reference to additional drawing figures. Those skilled in the art will understand that the present features can be realized in a number of forms without departing from their characteristics. In the case of a general message transmission-reception application, the application supports the transmission and reception of multimedia contents such as a still image (hereinafter referred to as an image), a video and the like. The message transmission-reception application may transmit a text message or content using an online dialog window in which at least one or more external terminals are configured as peer terminals. An online dialog window outputs a history of transmitted-received messages with a peer terminal. In this case, the output message history can be scrolled in a specific direction. However, it may be inconvenient for a user to transmit a large amount of content using an online dialog window of general form.
In the case of transmission and reception of a plurality of images, a history of the transmission and reception of the plurality of images is included in the transmitted-received message history of the online dialog window, and previously transmitted-received messages are scrolled up.
[0039] With respect to a receiving peer terminal receiving a plurality of the images, the terminal is unable to choose not to receive an undesired image. Hence, a data resource can be wasted. Accordingly, an embodiment of the present invention provides a control method capable of solving the above problem and of transmitting many contents more efficiently. Fig. 2 is a flowchart of a method for efficiently transmitting a plurality of images using an online text message dialog window according to an embodiment of the present invention. Fig. 3 is a diagram of a first example of easy designation of a plurality of images according to an embodiment of the present invention. In the following, the present invention is explained with reference to Figure 2 and Figure 3 together. In step S201, the controller 180 identifies a peer terminal. In the embodiments of the present invention described hereinafter, assume that a plurality of terminals 100 appear and that a message can be transmitted-received between the mobile terminals 100 once the terminals have identified each other. In the embodiments described below, each terminal may be designated as a mobile terminal A, a mobile terminal B, a mobile terminal C, and so on. A component of each mobile terminal is denoted by the corresponding letter. For example, a controller of a mobile terminal A is denoted 180A. In step S202, the controller 180 executes a message transmission-reception application and is then capable of transmitting and receiving a message with an identified peer terminal.
[0040] Figures 3 (a) to (c) are diagrams of an execution state of a mobile terminal A 100A of a sharing side wishing to share a plurality of images with a different peer terminal. In step S203, the controller 180A may output an online dialog window 301 including a history of transmitted-received messages via a touch screen 151A. Referring to the example shown in Fig. 3, the online dialog window 301 corresponds to a group online dialog window. Suppose that different users (requester 1 and requester 2) and a sharer wishing to transmit prescribed images to the different users join a group online dialog function. The group online dialog window indicates a screen for transmitting and receiving a message with a plurality of peer terminals. The group online dialog window may include a history of transmitted-received messages with a plurality of peer terminals and a text entry window for receiving input of a transmission message. Referring to the example shown in Figure 3, it is assumed that a user (sharer) of a mobile terminal 100A, a first peer (requester 1) and a second peer (requester 2) join a group online dialog function. Transmission messages 311-1 to 311-3 and reception messages 310-1/310-2 are output in a history 302 of transmitted-received messages. In this case, each of the transmission messages and the reception messages included in the history 302 of the transmitted-received messages can be output so as to be distinguished from each other. For example, as shown in Figure 3, the transmission messages are arranged and displayed on the right side of the online dialog window and the reception messages are arranged and displayed on the left side of the online dialog window, although the present invention is not limited thereto. In step S204, the controller 180A selects (designates) a portion of the images stored in a memory 170 or a cloud server according to a selection (designation) by a user. In the embodiments relating to Fig. 3 and Fig.
4, a first example of easy designation of an image to be shared among a plurality of images is explained. The first example proposes searching for a preferred image using a search keyword. As shown in Fig. 3 (b), if a long touch input 10a on a button 303 (hereinafter referred to as an image designation button) for designating an image, output on the group online dialog window 301, is received, the controller 180A can output a search keyword entry window 307 and a virtual keypad 306 on the touch screen 151A. If a prescribed keyword (Jeju Island) is entered in the search keyword entry window 307, the controller 180A can search the images stored in the memory 170 based on the entered keyword. At the same time, when outputting a search result 308, an embodiment of the present invention provides an efficient method of outputting the search result instead of simply listing it. The controller 180A according to an embodiment of the present invention can output the number of searched items according to a category. Referring to FIG. 3 (c), the controller 180 outputs the number 305-1 of images searched by the search keyword among the images stored in the memory 170 together with an image icon 304-1, so as to distinguish the images stored in the memory 170 from videos (films). And, the controller 180 can output the number 305-2 of videos searched by the search keyword among the videos stored in the memory 170 together with a video icon 304-2.
[0041] In addition, according to one embodiment of the present invention, the controller may search images stored in a cloud server linked with the mobile terminal 100 instead of the memory 170 and may then include the result in the search result 308. Referring to the example shown in Figure 3, examples of the cloud server include Google Drive, N Drive, and Box. In particular, the number 305-3 of searched images/videos displayed on a first cloud icon 304-3 corresponding to Google Drive, the number 305-4 of searched images/videos displayed on a second cloud icon 304-4 corresponding to N Drive, and the number 305-5 of searched images/videos displayed on a third cloud icon 304-5 corresponding to Box may be included in the search result 308. The search result 308 may further include the total number of searched images/videos and an icon 304-6 therefor. In the embodiments of the present invention described hereinafter, a search result 308 may have a form identical to the form mentioned earlier in FIG. 3. In the following, a method of designating an image using the search result 308 is explained with reference to FIG. 4.
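The per-source, per-type counts displayed next to each icon in the search result 308 can be sketched as a simple aggregation of keyword search hits. The result items below are made-up sample data; the function name and data shapes are illustrative assumptions, not part of the disclosure.

```python
# Illustrative aggregation of keyword search results into the per-category
# counts (memory images, memory videos, each cloud server) plus a total.
from collections import Counter

results = [
    {"title": "Jeju beach",  "type": "image", "source": "memory"},
    {"title": "Jeju hike",   "type": "video", "source": "memory"},
    {"title": "Jeju sunset", "type": "image", "source": "Google Drive"},
    {"title": "Jeju market", "type": "image", "source": "Box"},
]

def count_search_results(results):
    """Return per-(source, type) counts and the total number of hits."""
    counts = Counter((item["source"], item["type"]) for item in results)
    return counts, len(results)

counts, total = count_search_results(results)
print(counts[("memory", "image")], counts[("memory", "video")], total)  # 1 1 4
```

Each count would then be rendered alongside the corresponding icon (304-1 through 304-5), with the total shown on the icon 304-6.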
[0042] Figure 4 is a diagram of a method of designating an image to be transmitted based on a search result ranked by category in accordance with an embodiment of the present invention. Referring to Figs. 4 (a) and (b), the search result 308 mentioned earlier in Fig. 3 is output. According to the embodiments described in FIGS. 4 (a) and (b), it is possible to determine whether the search result is transmitted to all the peer terminals belonging to an online dialog window or to a part of the peer terminals belonging to the dialog window, based on a position where an icon is dragged and dropped.
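The drop-position decision just described can be sketched as a mapping from the drag-and-drop target to a recipient set: dropping on the bottom area of the dialog window selects every peer, while dropping on a specific received message selects only that message's sender. Target names and the function below are illustrative assumptions.

```python
# Hypothetical mapping from a drag-and-drop target to the set of recipients.
def recipients(drop_target, peers):
    """'dialog_bottom' means the bottom area of the dialog window (all peers);
    otherwise the target is the sender of a specific reception message."""
    if drop_target == "dialog_bottom":
        return set(peers)            # Figure 4 (a): all peer terminals
    if drop_target in peers:
        return {drop_target}         # Figure 4 (b): one specific peer terminal
    return set()

peers = ["requester1", "requester2"]
print(sorted(recipients("dialog_bottom", peers)))  # ['requester1', 'requester2']
print(recipients("requester2", peers))  # {'requester2'}
```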
[0043] As shown in FIG. 4 (a), if a prescribed icon 304-6 is touched and dragged to a prescribed position 403 of a group online dialog window 301, the controller 180 may transmit the search result images corresponding to the prescribed icon 304-6 to all peer terminals. An example of the prescribed position 403 may correspond to a bottom area of the online dialog window. In contrast, as shown in FIG. 4 (b), if the prescribed icon 304-6 is touched and dragged 10c onto a prescribed received message 310-2 of the group online dialog window 301, the controller 180 may transmit the search result images corresponding to the prescribed icon 304-6 only to the peer terminal corresponding to the transmission entity of the received message 310-2. As the example shows, if many images are transmitted, a data resource may be wasted on the receiver side for unwanted images. In particular, if billing is carried out according to the quantity of data usage, the waste of a data resource may lead to an increase in communication costs. Hence, an embodiment of the present invention proposes to first transmit a thumbnail [S205] for an image instead of the image itself. Subsequently, if a request for a preferred image is received from a peer terminal which has checked the transmitted thumbnail, the image itself can be transmitted to the peer terminal in response to the request. This will be explained in detail later.
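The drop-target routing of FIG. 4 can be sketched as follows: dropping the icon on the window bottom addresses every peer, dropping it on a received message addresses only that message's sender. The function name and the drop-target encoding are assumptions of this sketch.

```python
# Hedged sketch of FIG. 4's drop-target routing. A drop target is modeled
# as ('window', None) for the window bottom area (position 403) or
# ('message', sender) for a received message bubble. Illustrative only.

def resolve_recipients(drop_target, peers):
    """Return the peer terminals that should receive the thumbnails."""
    kind, sender = drop_target
    if kind == "window":
        return list(peers)   # FIG. 4 (a): transmit to all peer terminals
    if kind == "message":
        return [sender]      # FIG. 4 (b): transmit to the message's sender only
    raise ValueError("unknown drop target")

peers = ["Chulsoo Kim", "Gildong Hong"]
to_all = resolve_recipients(("window", None), peers)
to_one = resolve_recipients(("message", "Gildong Hong"), peers)
```

Only the thumbnail images would be sent to the resolved recipients in step S205; the originals follow later, on request.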
[0044] In step S206, the controller 180A may output a thumbnail list 400 for the transmitted thumbnail images in the group online dialog window 301. At the same time, according to a different embodiment of the present invention, when the thumbnail list 400 is output in the group online dialog window 301, a confirmation button 402 is further output to confirm whether to transmit the thumbnail images. After confirmation via the confirmation button 402, the thumbnail images of step S205 can be transmitted. The thumbnail list 400 displays a first thumbnail 401-1, a second thumbnail 401-2 and a third thumbnail 401-3. Other thumbnails can be output via scrolling. The scroll operation will be described later. At the same time, the thumbnail list 400 may preferentially display an image in which a face of the requester is included (displaying that image before the other items). If there are many images in which a face is included, an image in which the requester's face appears large (an image in which the area of the face is large) may be output first. A sharer of an image may touch a prescribed image to select or deselect the prescribed image on the thumbnail list 400. If a prescribed touch gesture 10d is received on the third thumbnail 401-3, the controller 180A can exclude the third thumbnail 401-3 from the thumbnail list to be transmitted. If a user selects or deselects thumbnails, determines the thumbnail list to be transmitted and selects the confirmation button 402, the controller 180A may transmit the thumbnail images according to the determined thumbnail list to the peer terminal (in the case of FIG. 4 (a), all peer terminals; in the case of FIG. 4 (b), a specific peer terminal) [S205]. In the following, an online dialog window according to an embodiment of the present invention and a scrolling direction of a thumbnail list are explained with reference to FIG. 5.
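The select/deselect-then-confirm flow of steps S205/S206 can be sketched as follows. The helper names and the thumbnail identifiers are assumptions of this example; the real terminal would of course operate on image data rather than identifier strings.

```python
# Hedged sketch of the thumbnail confirmation flow: a gesture toggles an
# item in or out of the selection, and confirming transmits only the
# selected thumbnails, in their original list order. Illustrative names.

def toggle(selected, thumb_id):
    """Select/deselect a thumbnail; returns a new selection set."""
    updated = set(selected)
    updated.symmetric_difference_update({thumb_id})
    return updated

def confirm_transmission(all_thumbs, selected):
    """Keep only selected thumbnails, preserving list order."""
    return [t for t in all_thumbs if t in selected]

thumbs = ["401-1", "401-2", "401-3"]
selected = set(thumbs)                  # everything selected initially
selected = toggle(selected, "401-3")    # gesture 10d deselects the third
payload = confirm_transmission(thumbs, selected)  # pressing button 402
```

Toggling "401-3" a second time would restore the full list, mirroring a user re-selecting a thumbnail before confirming.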
[0045] FIG. 5 is a diagram for explaining a scrolling direction of an online dialog window and of a thumbnail list according to an embodiment of the present invention.
[0046] An objective of the embodiment of the present invention relating to FIG. 5 is to minimize the impact on the online dialog history displayed in an online dialog window even when many images are shared. If many images are shared, the previously output online dialog history is scrolled far upward; in particular, in order to check the previous online dialog history, it may be necessary to scroll a considerable amount. As a result, although many images are transmitted, an embodiment of the present invention proposes to display the many images in a single text bubble and to scroll through the images within the text bubble. In particular, an embodiment of the present invention proposes that the scrolling direction within the text bubble be different from the scrolling direction of the online dialog window.
[0047] FIGS. 5 (a) to (c) are diagrams of an execution state of a message transmission/reception application of a mobile terminal B, which has received a thumbnail list. Specifically, FIG. 5 (a) shows the direction in which an online dialog window is scrolled. In general, a controller 180B outputs an online dialog window 301 scrolling in the up-and-down direction 501-1. FIG. 5 (b) shows a scrolling direction of a thumbnail list 400, which is output in a text bubble. The thumbnail list 400 according to an embodiment of the present invention may be scrolled in the left-to-right direction, different from the up-and-down direction, in response to a user's scroll command 10e. Subsequently, the controller 180B of the mobile terminal B 100B may further output an additional image request button 502 in a final portion of the thumbnail list 400. If the additional image request button 502 is selected 10f, the controller 180B may automatically transmit a request message for a different image to a sharing terminal (e.g. the mobile terminal A). An embodiment of receiving a request message for a different image via the additional image request button 502 is explained with reference to FIG. 6 in the following. FIG. 6 is a diagram of a control method of receiving an additional thumbnail list according to an embodiment of the present invention in the case of receiving a transmission request message for a different image. FIG. 6 (a) shows an execution state of a mobile terminal A 100A of a sharing side and FIGS. 6 (b) and (c) show an execution state of a mobile terminal B 100B of a requesting side.
[0048] If an input 10g touching an icon ("different image") included in an additional image request message received from the mobile terminal B 100B is received, as shown in FIG. 6 (a), the controller 180A outputs the search result 308 again and a user can designate a different image via the output search result 308.
[0049] Since the method of designating an image again is identical to the method of designating an image mentioned earlier in FIG. 4, its explanation is omitted here. The mobile terminal A 100A on which an image is designated again can transmit a thumbnail image again (repeating step S205).
[0050] Referring to FIG. 6 (b), the mobile terminal B 100B can output a second thumbnail list 400-2 for the thumbnail images received from the mobile terminal A 100A. In this case, the controller 180B can output each item of the second thumbnail list 400-2 by scrolling the second thumbnail list in the left-to-right direction 501-2.
[0051] An embodiment of the present invention provides a method of easily switching between thumbnail lists. If it is assumed that the list of newly received thumbnail images corresponds to a second thumbnail list 400-2 and the list of previously received thumbnail images corresponds to a first thumbnail list 400-1, switching between the first and the second thumbnail list can be achieved based on a touch gesture performed in a text bubble. In particular, if a touch-drag gesture 10h is received in a top-to-bottom direction while the second thumbnail list 400-2 is output, in the state of FIG. 6 (b), the controller 180B may output the first thumbnail list 400-1 by switching from the second thumbnail list, as shown in FIG. 6 (c). At the same time, each item included in the thumbnail list 400 can be output via scrolling. On the other hand, if the number of items is large, it is necessary to have a function of sorting the items in an appropriate order. The function is explained with reference to FIG. 7 in the following. FIG. 7 is a diagram of a sorting control method for a thumbnail list in a mobile terminal of a requesting side, which wants to share an image, according to an embodiment of the present invention.
[0052] FIG. 7 (a) shows an execution state of outputting a thumbnail list via a touch screen 151B of a mobile terminal B 100B of a requesting side. The mobile terminal B outputs a thumbnail list for the thumbnail images received from a mobile terminal A 100A of a sharing side. According to an embodiment of the present invention, if a prescribed command is received, face recognition is applied to each thumbnail, and a thumbnail in which a face of the user of the mobile terminal B is included can be rearranged to a front part of the thumbnail list based on the applied face recognition. In particular, the requester may intend to preferentially check an image in which the requester's own face is included. FIG. 7 (b) shows an example of the prescribed command. Referring to the example shown in FIG. 7 (b), the prescribed command corresponds to a shaking gesture 10 of the mobile terminal 100, by which the present invention is not limited. At the same time, an embodiment of selecting a thumbnail from the thumbnail lists shown in FIGS. 6 and 7 and requesting an original image for the selected thumbnail is explained with reference to FIG. 8 in the following.
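The face-based reordering above can be sketched as follows. Real face recognition is outside the scope of this sketch, so each thumbnail carries a precomputed face area (0 meaning the requester's face was not detected); the field names are assumptions of this example.

```python
# Hedged sketch of the shake-triggered reordering of FIG. 7: thumbnails
# containing the requester's face move to the front, larger face areas
# first (combining the face-area preference of paragraph [0044]).
# 'face_area' is assumed to come from a face-recognition step not shown.

def reorder_for_requester(thumbs):
    """Sort thumbnails by detected face area, largest first."""
    return sorted(thumbs, key=lambda t: t["face_area"], reverse=True)

thumbs = [
    {"id": "a", "face_area": 0},     # requester's face not detected
    {"id": "b", "face_area": 120},   # large face region
    {"id": "c", "face_area": 40},    # small face region
]
order = [t["id"] for t in reorder_for_requester(thumbs)]
```

Since `sorted` is stable, thumbnails without a detected face keep their received order at the tail of the list.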
[0053] FIG. 8 is a diagram of a control method of selecting a thumbnail from a thumbnail list and requesting an original image for the selected thumbnail according to an embodiment of the present invention.
[0054] FIGS. 8 (a) and (b) show an execution state of a mobile terminal B 100B of a requesting side and FIG. 8 (c) shows a state of a mobile terminal A 100A of a sharing side. As shown in FIG. 8 (a), if a user's selection input 10k is received on a thumbnail list 400, the controller 180 may output the selected thumbnail 800 in an identifiable manner. If a request button is selected 10m after the thumbnail is selected, the controller 180B may transmit a request message for the selected image to the mobile terminal A 100A of the sharing side.
[0055] A sharing request message 801 and the selected thumbnail image may be output to an online dialog window of the mobile terminal A 100A, which has received the request message. If a confirmation button 802 is selected 10n, an original image corresponding to the selected thumbnail can be transmitted to the mobile terminal B 100B of the requesting side.
[0056] At the same time, an embodiment of the present invention proposes to cause the mobile terminal A 100A of the sharing side either to directly transmit an image (in the case of an image stored in the mobile terminal A) or to grant an authorization to share an image stored in a cloud server of the mobile terminal A 100A, in response to a request for the original image. In particular, instead of directly transmitting an image, it can grant only an access right to the cloud server storing the image. An embodiment of the mobile terminal B 100B, to which an access right for a cloud server is granted, is explained with reference to FIG. 9 in the following.
[0057] FIG. 9 is a diagram of an image storage control method according to an embodiment of the present invention when a cloud access right is set. An embodiment of the present invention proposes to allow a user to select a storage size for an image to which an access right to a cloud server is granted. If an access right for a cloud server is granted by a user of a mobile terminal A 100A, a user of a mobile terminal B 100B of a requesting side may receive a message 901 indicating that the access right is established.
[0058] Having received the message 901, the mobile terminal B 100B may output a pop-up window 902 for selecting a storage size of an image when a thumbnail image to be downloaded is selected 10p. If one of the image storage sizes is selected from the pop-up window 902, the controller 180B requests the cloud server to download the image with the selected image size and can download the image. At the same time, an embodiment of the present invention further provides a control method capable of more conveniently checking a thumbnail in a thumbnail list.
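The storage-size selection of pop-up window 902 can be sketched as a download request that carries the chosen size. The size names, pixel dimensions and request format here are all assumptions of this example, not part of the disclosure.

```python
# Hedged sketch of the pop-up 902 flow: after a cloud access right is
# granted, the requester picks a storage size and the download request
# carries it. The size table and request dict are illustrative only.

SIZES = {
    "small": (320, 240),
    "medium": (1280, 960),
    "original": None,        # None = no resizing, fetch the stored file
}

def build_download_request(image_id, size_name):
    """Build the (assumed) request sent to the cloud server."""
    if size_name not in SIZES:
        raise ValueError("unknown storage size")
    return {"image": image_id, "resize_to": SIZES[size_name]}

req = build_download_request("IMG_0007", "medium")
```

Choosing a reduced size directly limits the billed data volume, which is the motivation given for thumbnail-first transmission in paragraph [0043].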
[0059] FIG. 10 is a diagram of a thumbnail output control method for enlarging a thumbnail in a thumbnail list according to an embodiment of the present invention. FIG. 10 (a) shows a thumbnail list 400 capable of being output to an online dialog window 301 and a text bubble 310-3 including the thumbnail list. If an outward pinch gesture 10q is received on a prescribed thumbnail item 401-4, the controller 180 may control the prescribed thumbnail item, or all of the thumbnail items, to be output in a magnified manner (refer to FIG. 10 (b)). In addition, an embodiment of the present invention proposes to output detail information 1001 on the enlarged thumbnail item 401-4 together with the enlarged thumbnail item 401-4. The detail information 1001 corresponds to metadata information of the image and may include information about the location where the image was captured or information about the date on which the image was captured.
[0060] FIG. 10 (c) shows the text bubble 310-3 output so as to include the magnified image and the detail information. Although a thumbnail image is magnified, the size of the area occupied by the thumbnail image is not significantly changed relative to the entire online dialog window area. Hence, interruption of other message content of the online dialog window can be minimized. Conversely, a thumbnail change in response to an inward pinching gesture is explained in the following with reference to FIG. 11.
[0061] FIG. 11 is a diagram of a summary information output control method for a thumbnail list according to an embodiment of the present invention. FIG. 11 (a) shows a thumbnail list 400 capable of being output to an online dialog window 301 and a text bubble 310-3 including the thumbnail list. If an inward pinching gesture 10 is received on a prescribed thumbnail item 401-4, the controller 180 may control summary information 1101 on the thumbnail list to be output (refer to FIG. 11 (b)).
[0062] The summary information 1101 may include the number of images or files included in the thumbnail list. In addition, according to an embodiment of the present invention, the total data size 1104 of the selected images can be further output together with the summary information. This is because, as mentioned in the foregoing description, billing is performed according to the transmitted and received data. FIG. 11 (c) shows an execution state of a mobile terminal B 100B of a requesting side. An embodiment of the present invention proposes to additionally output a file name together with a share request message 311-5. If a file icon output on the share request message 311-5 is touched 10t, the controller 180B may additionally output a file name 1102 corresponding to the file icon. At the same time, a second requester can submit a request for an image which is shared with a first requester. This embodiment will be explained with reference to FIG. 12 in the following. FIG. 12 is a diagram of a control method of requesting an image which is shared between other mobile terminals, according to an embodiment of the present invention. FIG. 12 (a) shows an execution state of a touch screen 151C of a mobile terminal C 100C of a requesting side, on which an online dialog window 301 is output. If an image is shared between other terminals, an embodiment of the present invention proposes to cause a controller 180C to output a guide message 310-4 for the shared image and a request button 1201 (refer to FIG. 12 (a)). In particular, in this case, a thumbnail for the shared image may not be shown; instead, the simple guide message 310-4 can be output.
[0063] If the request button 1201 is touched 10u, the controller 180C may transmit a request message 310-5 to a mobile terminal A 100A of a sharing side. FIG. 12 (b) shows a state of the mobile terminal A 100A of the sharing side, and it can be verified that the request message 310-5 is included in an online dialog window 301. If a confirmation button (Yes) 1202 included in the request message 310-5 is selected 10v, the controller 180A can transmit a thumbnail image to the mobile terminal C 100C (repeating step S205). Similarly, as shown in FIG. 12 (c), a text bubble 310-6 including a thumbnail list 400 is output to an online dialog window 301 of the mobile terminal C 100C, and a user of the mobile terminal C 100C can select a thumbnail item from the thumbnail list. At the same time, as mentioned in the foregoing description, when an access right is established for a cloud server, if a mobile terminal leaves an online dialog window, it may be preferable to no longer give the access right to the mobile terminal. This embodiment will be explained with reference to FIG. 13 in the following. FIG. 13 is a diagram of an automatic access right change control method for a cloud server according to an embodiment of the present invention when a mobile terminal A 100A of a sharing side leaves an online group dialog function. FIG. 13 (a) shows a list of participants participating in an online group dialog function. As shown in FIG. 13 (a), assume that a sharer (Younghee Lee) and two requesters (Chulsoo Kim and Gildong Hong) are included in the list of participants. And, suppose that the access right for a cloud server is granted to the requester (Chulsoo Kim) for a specific image.
If a quit button 1303 for quitting an online chat room is selected 10w, a controller 180A of the sharing side can leave the online group dialog function (terminate the online dialog window) and the cloud access right granted in the online dialog window can be configured to be automatically released. FIG. 13 (b) shows an access right configuration screen per user. Referring to FIG. 13 (b), it can be verified that the right set for the requester (Chulsoo Kim) is automatically changed to a released state (not accessible) 1305. At the same time, according to the embodiment mentioned earlier with reference to FIG. 3, images are searched using a search keyword, and a control method (first example) of designating a plurality of images from a search result has been explained. In the following FIGS. 14 to 21, various examples of designating a plurality of images are explained. In the embodiment explained with reference to FIG. 14, assume that a face of an online dialog peer is stored in advance in a mobile terminal A 100A of a sharing side.
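The automatic release of FIG. 13 can be sketched by scoping each grant to the chat room in which it was made, so that leaving the room drops every right granted there. The class, its data layout and the room identifier are assumptions of this example.

```python
# Hedged sketch of FIG. 13: cloud access rights are tracked per chat
# room; when the sharer leaves the room, every right granted inside it
# is released automatically. The data layout is an assumption.

class CloudAccessRights:
    def __init__(self):
        self._grants = {}            # room_id -> set of requester names

    def grant(self, room_id, requester):
        self._grants.setdefault(room_id, set()).add(requester)

    def is_accessible(self, room_id, requester):
        return requester in self._grants.get(room_id, set())

    def leave_room(self, room_id):
        """Sharer quits the room: release all rights granted in it."""
        self._grants.pop(room_id, None)

rights = CloudAccessRights()
rights.grant("room1", "Chulsoo Kim")
before = rights.is_accessible("room1", "Chulsoo Kim")
rights.leave_room("room1")           # quit button 1303 is selected
after = rights.is_accessible("room1", "Chulsoo Kim")
```

Tying the grant's lifetime to the room mirrors the released state 1305 shown on the per-user configuration screen of FIG. 13 (b).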
[0064] If a specific gesture is received, the controller 180A searches for images including the stored face of the peer and can output a search result 308. In this case, as mentioned earlier in FIG. 3, the search result 308 can output the number of searched items according to a category.
[0065] For example, the specific gesture may include a gesture 10x of shaking the mobile terminal A 100A while an online dialog window is output via the touch screen 151A. Although the embodiment explained with reference to FIG. 14 corresponds to a control method of searching for images based on the faces of all the peers participating in an online dialog function, it is also possible to designate a specific peer. As shown in FIG. 15 (a), if a shaking gesture 10x of the mobile terminal is received while a message received from a peer terminal of "Chulsoo Kim" is touched 10y, the controller 180 searches for images in which a face of the peer ("Chulsoo Kim") is included and can output a search result 308. In the following, an example different from that of FIG. 15 for designating a specific peer is explained with reference to FIG. 16.
[0066] At the same time, in the embodiment shown in FIG. 15, although a hand different from the hand holding the mobile terminal A 100A is used for the touch 10y, the message can also be touched using a thumb of the hand holding the mobile terminal.
[0067] Referring to FIG. 16 (a), if a touch input 10z on a message 310-6 received from a peer terminal of a user ("Gildong Hong") and a touch input on an image designation button 303 are received on an online dialog window 301 of the mobile terminal A 100A, the controller searches for images in which the face of the user ("Gildong Hong") corresponding to the transmission entity of the received message 310-6 is included and can output a search result 308. At the same time, the specific peer designation control method, which is explained with reference to FIG. 16, can be combined with a different search process.
[0068] For example, as shown in FIG. 17 (a), when a first search result 308-1 is output based on a prescribed search keyword such as "Jeju Island", if a prescribed peer (Gildong Hong) is designated, the controller searches again for images in which the face of the peer is included and can output a second search result 308-2. In this case, in order to designate the prescribed peer, a gesture 10aa of touching a received message 310-6 and touch-dragging it onto the first search result 308-1 can be used. At the same time, if face information on a peer is not stored, the face of the peer can be directly captured by activating a camera. This embodiment will be explained with reference to FIG. 18 in the following. FIG. 18 is a diagram of a control method of capturing an image via an activated camera and designating a plurality of images based on the captured image according to an embodiment of the present invention. In FIG. 18 (a), assume a situation where a requester requests, from a user, an image in which the requester is captured. If the user does not have information on the requester's face, the user can capture an image by immediately activating a camera.
[0069] Referring to FIG. 18 (a), a controller 180A of a mobile terminal A 100A of a sharing side receives a selection input of an image designation button 303. If the image designation button 303 is selected 10bb, the controller 180A automatically activates the camera 121 and can capture a face of a peer using the camera based on a capture command of the user. If the face of the peer is captured, the controller 180A searches for images based on the captured face of the peer and can output a search result 308.
[0070] In the embodiments mentioned in the foregoing description, a control method of designating a plurality of images from an image search result 308 at one time has been explained. In the following, a control method of individually designating an image via a thumbnail list is explained.
[0071] FIG. 19 is a diagram of a control method of individually designating an image via a thumbnail list according to an embodiment of the present invention. As shown in FIG. 19 (a), if an image designation button 303 is selected 10cc and a prescribed icon 304-6 is selected, the controller 180 can output a thumbnail list, as shown in FIG. 19 (b). It may further output a storage position indicator 1901 on each thumbnail item 1900 to indicate the position at which each thumbnail item is stored. An indicator such as "Gallery" can be output on an image stored via a gallery application of the mobile terminal 100. An indicator such as "Google" can be output on an image stored via Google Drive. An indicator such as "Naver" can be output on an image stored via N Drive. If a touch gesture is received on a prescribed thumbnail item 1900, as shown in FIG. 19 (b), the controller 180 may configure an image corresponding to the thumbnail item 1900 to be transmitted or shared to/with a peer terminal on an online dialog window 301. According to a different embodiment of the present invention, if a touch gesture 10ee is received on a prescribed thumbnail item 1900, as shown in FIG. 19 (c), the controller 180 may output a pop-up window 1902 for designating a peer terminal with which the image is to be shared. If a prescribed peer terminal is selected via the output pop-up window 1902, the controller may control the transmission or sharing of an image corresponding to the thumbnail item 1900 to/with the selected prescribed peer terminal. In the following, a control method of designating an image according to a date is explained with reference to FIG. 20. FIG. 20 is a diagram of a control method of designating an image according to a date in accordance with an embodiment of the present invention. Referring to FIG. 20 (a), a controller 180A of a mobile terminal A 100A of a sharing side outputs a thumbnail list 2000.
If a command 10gg of touching a scroll bar of the output thumbnail list 2000 is received, the controller 180A can additionally output a date 2001 corresponding to the scroll bar position. As shown in FIG. 20 (b), if the touch 10gg is dragged 10h, the date can be changed to a date 2011 corresponding to the dragged position. If a prescribed time elapses while the touch 10gg is maintained, the controller 180A can automatically output images corresponding to the date as a search result 308 (refer to FIG. 20 (c)). At the same time, the designation of images can be performed based on the location at which the images are captured. In the following, a control method of designating an image based on the location at which the image is captured is explained with reference to FIG. 21.
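The date scrubber of FIG. 20 can be sketched by mapping the scroll-bar position onto the date span covered by the stored images. The function name, the date range and the linear mapping are assumptions of this example.

```python
# Hedged sketch of FIG. 20's scrubber: a scroll position in [0, 1] is
# mapped linearly onto the date range of the stored images; holding the
# touch would then trigger a search for the mapped date. Illustrative.
from datetime import date, timedelta

def position_to_date(pos, first, last):
    """Map a scroll position 0.0..1.0 onto the date span [first, last]."""
    span = (last - first).days
    return first + timedelta(days=round(pos * span))

first, last = date(2014, 1, 1), date(2014, 12, 31)
midpoint = position_to_date(0.5, first, last)
```

The mapped date corresponds to the label 2001/2011 shown next to the scroll bar; searching on touch-hold would then filter images by that date.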
[0072] FIG. 21 is a diagram of an image designation control method based on the location at which the image is captured. Referring to FIG. 21 (a), a controller 180A of a mobile terminal A 100A of a sharing side outputs a thumbnail list 2000. If a touch gesture 10jj is received on a specific item 1900, as shown in FIG. 21 (b), the controller 180A can output a capture location 2102 corresponding to the specific item 1900. If the output capture location 2102 is selected 10kk, the controller 180A searches for images captured at a location identical to the capture location 2102 and can output a search result 308 (refer to FIG. 21 (c)). At the same time, despite the reference to an identical capture location 2102, it may be unnecessary to require location information satisfying a completely identical condition. In particular, images captured within a prescribed radius from the capture location 2102, or images from a location belonging to the same administrative region as the capture location 2102, can be searched. At the same time, a different embodiment of designating an image based on the location at which the image is captured and the date on which the image is captured is explained with reference to FIGS. 22 to 25 in the following. FIGS. 22 to 25 are diagrams of an image designation control method based on the location at which the image is captured and the date on which the image is captured according to an embodiment of the present invention. Referring to FIG. 22 (a), a controller 180A of a mobile terminal A 100A of a sharing side outputs a thumbnail list 2000. If a selection input 10mm of a specific item 1900 is received, the controller 180A may output a pop-up window 2200 including a date 2201 on which the specific item 1900 was captured and a location 2202 at which the specific item was captured. A method of searching for an image by controlling the pop-up window 2200 is explained with reference to FIGS. 23 and 24 in the following.
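The "within a prescribed radius" matching mentioned above can be sketched with a great-circle distance test. The coordinates and the 2 km radius are illustrative assumptions; the disclosure does not specify a radius value.

```python
# Hedged sketch of radius-based location matching: instead of exact
# equality, an image matches if it lies within a prescribed radius of
# the capture location 2102. Coordinates and radius are illustrative.
import math

def haversine_km(a, b):
    """Great-circle distance between (lat, lon) pairs, in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def within_radius(capture_loc, image_loc, radius_km=2.0):
    return haversine_km(capture_loc, image_loc) <= radius_km

jeju_city = (33.4996, 126.5312)
nearby = (33.5040, 126.5400)      # roughly a kilometre away
seoul = (37.5665, 126.9780)       # hundreds of kilometres away
```

The alternative mentioned in the text, matching by administrative region, would replace the distance test with a region comparison, as sketched after paragraph [0073].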
[0073] An embodiment of the present invention proposes to use the capture date 2201 and the capture location 2202 as image search conditions and to allow a user to enable or disable each search condition. FIGS. 23 (a) and (b) show a state in which the search condition of the capture date 2201 is disabled. In order to indicate that the capture date is disabled, a strikethrough is displayed on the capture date 2201. Referring to FIG. 23 (a), a capture location 2202 such as "Ora-dong Jeju-si Jeju Island" is enabled as a search condition, and the images captured at the capture location 2202 are output according to date folders. When a capture location 2202 such as "Ora-dong Jeju-si Jeju Island" is enabled as a search condition, if "Jeju-si" is selected, the controller 180A excludes the segmented administrative region "Ora-dong" from the search condition (strikethrough 2303, FIG. 23 (b)), searches for images captured in "Jeju-si" and can output the searched images. In this case, as shown in FIG. 23 (b), the name of the administrative region such as "Jeju-si" can be displayed so that it can be identified 2301.
[0074] If an input 10pp for selecting a date folder (9/7 (Sunday)) is received, the controller 180 may output detail image thumbnails corresponding to the date. FIGS. 24 (a) and (b) show a state in which the search condition of the capture location 2202 is disabled. In order to indicate that the capture location is disabled, a strikethrough is displayed on the capture location 2202. When a capture date 2201 such as "September 5, 2014 17:50" is enabled as a search condition, if "5" is selected, the controller 180A excludes the segmented time such as "17:50" from the search condition and may output a strikethrough, as shown in FIG. 24 (a). Subsequently, the controller sets "5 September 2014" as the reference date and can output images corresponding to dates close to the reference date, including the reference date, according to folders (2402-1 to 2402-4). The date can be set based on a touch gesture 10rr received on "5". If the date is set, folders can be sorted based on the set date, as shown in FIG. 24 (b). FIG. 25 is a diagram of a control method of designating images, distinguished from each other according to folders, as sharing images according to an embodiment of the present invention.
[0075] Referring to FIGS. 25 (a) and (b), as mentioned earlier in FIG. 24, they show a result in which images are searched based on the capture date 2201 and the capture location 2202. If a confirmation button 2501 is selected, the controller 180 searches for images corresponding to the conditions (2201/2202) and can output a search result 308. At the same time, according to the embodiment mentioned earlier with reference to FIG. 3, images are searched based on a search keyword entered by a sharer. In the following, an image search control method of a sharing side in which a requester enters a keyword directly is explained with reference to FIG. 26. FIG. 26 is a diagram of a control method of receiving a search keyword from a requesting side and searching for images based on the search keyword according to an embodiment of the present invention. FIG. 26 (a) shows an online dialog window of a mobile terminal B 100B of a requesting side. If a prescribed button is selected 10uu after a search keyword 2601 is entered, the search keyword 2601 can be transmitted to a mobile terminal A 100A of a sharing side.
[0076] FIGS. 26 (b) and (c) show a state of the mobile terminal A 100A of the sharing side to which the search keyword 2601 is transmitted. As shown in FIG. 26 (b), the controller 180A may output a pop-up window 2602 on an online dialog window 301 to query whether to search for images based on the search keyword 2601.
[0077] If a confirmation button is selected 10vv, the controller 180 searches for images based on the transmitted search keyword 2601 and can output a search result 308. At the same time, a search keyword can also be designated from a received message. This will be explained with reference to FIG. 27 in the following.
[0078] FIG. 27 is a diagram of a control method of designating a search keyword in a received message according to an embodiment of the present invention. FIG. 27 shows an online dialog window 301 of a mobile terminal A 100A of a sharing side. As shown in FIG. 27 (a), if an outward pinching gesture 10ww is received on a received message 310-7, the controller 180 may display a first word 2701-1 included in the received message 310-7.
[0079] If a switching gesture 10xx is received on the received message 310-7, as shown in Fig. 27 (c), the controller 180 can output a second word 2701-2 instead of the first word 2701-1, so as to switch from the first word to the second word.
[0080] If a touch gesture 10yy designating a prescribed counterpart is received while the second word 2701-2 is touched, the controller 180A may control the search for images based on the second word 2701-2 and the prescribed counterpart. In the aforementioned embodiment, a method of controlling the designation of a date or location for selecting a plurality of images at the same time has been explained. In the following description, a more convenient method of controlling the designation of a date or location is provided. In addition, an image search control method based on a person captured in an image is explained alongside the image search control methods based on date and location. The date, location and person referred to above are called image search categories. In particular, if a user selects the date category, the user designates a preferred date and may then use the preferred date as a search condition for an image.
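The pinch-out and switching gestures above amount to stepping through the words of a received message one at a time. Below is a minimal sketch, assuming a simple whitespace split and wrap-around after the last word; the class and method names are illustrative, not from the patent.

```python
class WordSelector:
    """Steps through the words of a received message, as the switching
    gesture does in Fig. 27; wraps around after the last word."""

    def __init__(self, message: str):
        self.words = message.split()
        self.index = 0  # the pinch-out gesture shows the first word

    def current(self) -> str:
        return self.words[self.index]

    def switch(self) -> str:
        """Advance to the next word (one switching gesture)."""
        self.index = (self.index + 1) % len(self.words)
        return self.current()

selector = WordSelector("meet me at City Hall tomorrow")
first = selector.current()   # "meet"
second = selector.switch()   # "me"
```

The selected word would then serve as the search keyword, combined with the designated counterpart as described above.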
[0081] Fig. 28 is a diagram of a method of controlling a change of an image search condition category according to an embodiment of the present invention. According to one embodiment of the present invention, if a touch-slip input is received in a first direction, the present invention proposes to change the category of the image search condition. Referring to Fig. 28 (a), the controller 180A of the mobile terminal 100A of the sharing side outputs a thumbnail list 2000. The thumbnail list may correspond to a thumbnail list 2000 output on a gallery application. If a touch-slip input is received in a first direction on the output thumbnail list 2000 or a run screen of the gallery application, the controller 180A may change the category of the image search condition from a first category to a second category. As an example, if a touch-slip input is received in a horizontal direction of the screen, the image search condition can be changed from a date category to a location category. This is explained in more detail step by step. Referring to Fig. 28 (a), if a touch input 10zz on a scroll bar 2803 of the output thumbnail list 2000 is received, the controller 180A may further output a date 2001 corresponding to the touch point on the scroll bar 2803. In particular, suppose that the category of the image search condition is set 2801 to date. If a slip input dragging the touch point along the scroll bar 2803 is received, as mentioned in the foregoing description, the date can be changed. If a slip input 10ab in a first direction (a horizontal direction in the drawing) is received while the touch 10zz is maintained, the controller 180A can change the category 2802 of the image search condition from the first category to the second category (in the example, from date to location) (switching from Fig. 28 (a) to (b)). As shown in an example of Fig. 28 (b), the controller may output an image search condition corresponding to the location category.
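The first-direction drag described above cycles the search-condition category in a fixed order (date, then location, then person). A hypothetical sketch of that state change, with illustrative names only:

```python
CATEGORIES = ["date", "location", "person"]  # order used in Fig. 28

def next_category(current: str) -> str:
    """Advance the image-search-condition category in response to a
    drag in the first direction, wrapping after the last category."""
    i = CATEGORIES.index(current)
    return CATEGORIES[(i + 1) % len(CATEGORIES)]
```

Each first-direction drag while the touch is held would call `next_category` once, so repeated drags walk date → location → person → date.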
Referring to Fig. 28 (b), "City Hall" 2804 is shown as an example of the image search condition. If a slip input in a second direction is received, the above-mentioned image search condition may be switched to a different condition (e.g., from City Hall to Jeju Island) (this will be described with reference to Fig. 29). The image search condition(s) corresponding to the location category may be retrieved from image tag information stored in the memory 170. Subsequently, if the slip input in the first direction is continuously input 10ac, the controller 180A can change the image search condition category from the second category to a third category (in the example, from location to person) (switching from Fig. 28 (b) to (c)). As shown in an example of Fig. 28 (c), the controller may output an image search condition corresponding to the person category. In Fig. 28 (c), "John" 2806 is shown as an example of the image search condition. In particular, if the image search condition "John" 2806 is selected, the controller 180A searches for images matching the person John (i.e., images including the person John) and can then provide the images to a user. Similarly, if a slip input in a second direction is received, the above-mentioned image search condition may be switched to a different condition (e.g., from John to Kim). Hereinafter, a method of controlling switching between image search conditions in a switched location category in response to a touch-slip input in a second direction is explained in more detail with reference to Fig. 29. Fig. 29 is a diagram of a method of controlling switching between image search conditions in a switched location category in response to a touch-slip input in a second direction according to an embodiment of the present invention.
[0082] Referring to Fig. 29 (a), the location category is selected by the control method mentioned earlier in Fig. 28 and a first image search condition "City Hall" 2804 is designated. If a slip input 10ad in a second direction (in one example, a vertical direction) is received while the touch 10zz is maintained, the controller 180A can switch the first image search condition to a second image search condition (in the example, from City Hall to Jeju Island). As mentioned in the foregoing description, the image search conditions included in the location category can be retrieved from image tag information stored in the memory 170.
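Within the selected category, the second-direction drag steps through the conditions recoverable from stored tag information (e.g., from "City Hall" to "Jeju Island"). The sketch below assumes, purely for illustration, that each image carries a tag dictionary keyed by category; neither the data layout nor the function names come from the patent.

```python
from typing import Dict, List, Optional

def conditions_for(images: List[Dict[str, str]], category: str) -> List[str]:
    """Collect the distinct search conditions for a category from
    per-image tag information, in first-seen order."""
    seen: List[str] = []
    for img in images:
        value: Optional[str] = img.get(category)
        if value is not None and value not in seen:
            seen.append(value)
    return seen

def next_condition(conditions: List[str], current: str) -> str:
    """One second-direction drag: switch to the next condition."""
    i = conditions.index(current)
    return conditions[(i + 1) % len(conditions)]

tags = [
    {"location": "City Hall", "person": "John"},
    {"location": "Jeju Island", "person": "Kim"},
    {"location": "City Hall", "person": "John"},
]
places = conditions_for(tags, "location")       # ["City Hall", "Jeju Island"]
switched = next_condition(places, "City Hall")  # "Jeju Island"
```

The same pair of helpers would serve the person category (e.g., stepping from "John" to "Kim") by passing `"person"` instead of `"location"`.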
[0083] If a slip input is continuously received while the touch 10zz is maintained, the second image search condition is changed to a third image search condition. In the following, when an image search condition is designated, a method of controlling the transmission of a plurality of images corresponding to the designated image search condition is explained with reference to Fig. 30. Fig. 30 is a diagram of a method of controlling transmission of a plurality of images corresponding to a designated image search condition when the image search condition is designated according to an embodiment of the present invention. Referring to Fig. 30 (a), it shows the state of the image search condition mentioned earlier in Fig. 29. In the example shown in Fig. 30 (a), the search condition is switched to "Jeju Island" 2804. In this switched state, if a command (for example, a long touch input holding the touch 10zz for longer than a prescribed time) to designate the image search condition is received from a user, as shown in Fig. 30 (b), the controller 180A can control the automatic selection 3001 of at least one or more images corresponding to the designated image search condition. For example, as shown in Fig. 30 (b), the selected images can be images whose checkbox is automatically selected. After a checkbox is selected or released by the user, if a send button 3002 is selected, the controller 180A may transmit the selected image(s) to a peer terminal. If a peer terminal is not yet designated (for example, if a plurality of images are selected on a gallery application), as shown in Fig. 30 (c), the controller may output a peer list 3003 for designating a peer terminal. Meanwhile, as mentioned in the foregoing description, in the case of transmitting a plurality of images to a designated counterpart, it is possible to transmit only thumbnail images of a plurality of the images instead of directly transmitting a plurality of
images according to an embodiment of the present invention. If the peer terminal, which has received only the thumbnail images, designates a preferred image among a plurality of the images and requests its transmission, image data for the designated image only may be transmitted. As mentioned in the foregoing description, the previously transmitted thumbnail images may be output on an online dialog window 301 in the form of a thumbnail list 400, as mentioned earlier in Fig. 4 (c). Various embodiments may be implemented using a machine-readable medium on which instructions for execution by a processor are stored to perform the various methods presented herein. Examples of machine-readable media include HDD (hard disk drive), SSD (solid state disk), SDD (silicon disk drive), ROM, RAM, CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, the other types of storage media presented herein, and combinations thereof. If desired, the machine-readable medium may be embodied in the form of a carrier wave (e.g., a transmission over the Internet). The processor may include the controller 180 of the mobile terminal. The foregoing embodiments and advantages are merely illustrative and should not be construed as limiting the present invention. The present teachings can be readily applied to other types of apparatus. This description is intended to be illustrative and not to limit the scope of the claims. Many alternatives, modifications and variations will be apparent to those skilled in the art. The features, structures, methods or other characteristics of the embodiments described herein may be combined in various ways to obtain additional and/or alternative embodiments.
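The thumbnail-first exchange described above (transmit small previews of all shared images, then full image data only for the image the peer requests) can be sketched roughly as follows. The class name, the byte-prefix stand-in for real thumbnail downscaling, and the 8-byte preview size are all illustrative assumptions, not details from the patent.

```python
from typing import Dict

class SharingSide:
    """Holds full-resolution image payloads and serves them in two
    phases: thumbnails first, full data only for requested images."""

    THUMB_BYTES = 8  # stand-in for real thumbnail generation

    def __init__(self, images: Dict[str, bytes]):
        self.images = images  # image id -> full-resolution payload

    def thumbnails(self) -> Dict[str, bytes]:
        """Phase 1: small previews for every shared image."""
        return {img_id: data[:self.THUMB_BYTES]
                for img_id, data in self.images.items()}

    def full_image(self, img_id: str) -> bytes:
        """Phase 2: full payload for the one image the peer requested."""
        return self.images[img_id]

sharer = SharingSide({"a.jpg": b"x" * 100, "b.jpg": b"y" * 200})
previews = sharer.thumbnails()          # both previews, 8 bytes each
requested = sharer.full_image("b.jpg")  # only b.jpg is sent in full
```

The design point this models is the resource saving motivated in the introduction: the requesting side downloads full image data only for images it actually wants, rather than receiving every shared image outright.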
As the present features may be embodied in several forms without departing from their characteristics, it should also be understood that the above-described embodiments are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within the scope defined in the appended claims. Therefore, all changes and modifications that fall within the metes and bounds of the claims, or equivalents of such metes and bounds, are intended to be embraced by the appended claims.
[0084] Of course, the invention is not limited to the embodiments described and shown above, from which other modes and other embodiments may be provided without departing from the scope of the present invention.
Claims (20)
[0001]
1. A mobile terminal (100), comprising: a touch screen; a memory (170) configured to store a plurality of images; a wireless communication unit (110) configured to transmit-receive data with a peer terminal; and a controller (180) configured, if an input for selecting partial images from a plurality of the stored images is received, to control the wireless communication unit (110) to transmit thumbnail images for the selected partial images to the peer terminal, the controller (180) being configured to control the touch screen to output an online dialog window containing a history of messages transmitted-received with the peer terminal.
[0002]
The mobile terminal (100) according to claim 1, wherein the input for selecting the partial images from a plurality of the images corresponds to a designation input of an image search condition and wherein the selected partial images correspond to an image(s) satisfying the designated image search condition.
[0003]
The mobile terminal (100) of claim 2, wherein the controller (180) is configured to control the touch screen to output a thumbnail list corresponding to a plurality of the stored images, wherein the input for selecting the partial images from a plurality of the images comprises a touch-slip input in a first direction received on the output thumbnail list and wherein if the touch-slip input is received in the first direction, the controller (180) is configured to switch a category for the image search condition between a capture date, a person and a location.
[0004]
The mobile terminal (100) according to claim 3, wherein the input for selecting the partial images from a plurality of the images comprises a touch-slip input in a second direction received on the output thumbnail list, wherein if the touch-slip input is received in the second direction, an image search condition belonging to the switched category is switched, and wherein the stored image search condition corresponds to an image search condition switched by the touch-slip input received in the second direction.
[0005]
The mobile terminal (100) according to claim 1, wherein the controller (180) is configured to further output a thumbnail list for the thumbnail images transmitted on the online dialog window.
[0006]
The mobile terminal (100) according to claim 5, wherein the controller (180) is configured to scroll the message output history in a first direction and scroll the thumbnail list in a second direction different from the first direction.
[0007]
The mobile terminal (100) according to claim 5, wherein the controller (180) is configured to output each of the transmitted-received messages as a text bubble and to cause the thumbnail list to be contained in a text bubble.
[0008]
The mobile terminal (100) according to claim 1, wherein if a request message for the partial images among the transmitted thumbnail images is received, the controller (180) is configured to control the wireless communication unit (110) to transmit an image corresponding to the received request to the peer terminal.
[0009]
The mobile terminal (100) according to claim 1, wherein if a request message for the partial images among the transmitted thumbnail images is received, the controller (180) is configured to control the wireless communication unit (110) to upload an image corresponding to the received request to a sharing server and to establish a sharing right for the uploaded image for the peer terminal.
[0010]
The mobile terminal (100) according to claim 9, wherein if the peer terminal exits the online dialog window, the controller (180) is configured to control the wireless communication unit (110) to release the sharing right established for the peer terminal.
[0011]
11. A method of controlling a mobile terminal (100), comprising the steps of: storing a plurality of images in a memory; transmitting-receiving data with a peer terminal; if an input for selecting partial images from a plurality of the stored images is received, transmitting thumbnail images for the selected partial images to the peer terminal; and outputting an online dialog window containing a history of messages transmitted-received with the peer terminal.
[0012]
The method according to claim 11, wherein the input for selecting the partial images from a plurality of the images corresponds to a designation input of an image search condition and in which the selected partial images correspond to an image (s) satisfying the designated image search condition.
[0013]
The method of claim 12, further comprising the steps of: outputting a thumbnail list corresponding to a plurality of the stored images, wherein the input for selecting the partial images from a plurality of the images comprises a touch-slip input in a first direction received on the output thumbnail list; and if the touch-slip input is received in the first direction, switching a category for the image search condition between a capture date, a person and a location.
[0014]
The method of claim 13, wherein the input for selecting the partial images from a plurality of the images comprises a touch-slip input in a second direction received on the output thumbnail list, wherein if the touch-slip input is received in the second direction, the method further comprises the step of switching an image search condition belonging to the switched category, and wherein the stored image search condition corresponds to an image search condition switched by the touch-slip input received in the second direction.
[0015]
The method of claim 11, wherein the outputting step further outputs a thumbnail list for the thumbnail images transmitted on the online dialog window.
[0016]
The method of claim 15, wherein the outputting step scrolls the message output history in a first direction and scrolls the thumbnail list in a second direction different from the first direction.
[0017]
The method of claim 15, wherein the outputting step outputs each of the transmitted-received messages as a text bubble and causes the thumbnail list to be contained in a text bubble.
[0018]
The method of claim 11, wherein if a request message for the partial images among the transmitted thumbnail images is received, the method further comprises the step of transmitting an image corresponding to the received request to the peer terminal.
[0019]
The method of claim 11, wherein if a request message for the partial images among the transmitted thumbnail images is received, the method further comprises the steps of: uploading an image corresponding to the received request to a sharing server; and establishing a sharing right for the uploaded image for the peer terminal.
[0020]
20. The method of claim 19, wherein if the peer terminal leaves the online dialog window, the method further comprises the step of releasing the sharing right established for the peer terminal.
Patent family:
Publication number | Publication date
CN105791536B|2020-11-20|
FR3031601B1|2019-08-30|
US9990124B2|2018-06-05|
US20160202889A1|2016-07-14|
CN105791536A|2016-07-20|
EP3046314A1|2016-07-20|
KR20160087640A|2016-07-22|
EP3046314B1|2019-04-03|
Legal status:
2016-05-30| PLFP| Fee payment|Year of fee payment: 2 |
2017-05-30| PLFP| Fee payment|Year of fee payment: 3 |
2017-12-22| PLSC| Publication of the preliminary search report|Effective date: 20171222 |
2018-05-29| PLFP| Fee payment|Year of fee payment: 4 |
2019-04-19| PLFP| Fee payment|Year of fee payment: 5 |
2020-04-08| PLFP| Fee payment|Year of fee payment: 6 |
2021-04-09| PLFP| Fee payment|Year of fee payment: 7 |
Priority:
Application number | Filing date | Patent title
KR1020150006861A|KR20160087640A|2015-01-14|2015-01-14|Mobile terminal and method for controlling the same|
KR20150006861|2015-01-14|